A Smoothing Regularizer for Feedforward and Recurrent Neural Networks



Similar resources

A Smoothing Regularizer for Recurrent Neural Networks

We derive a smoothing regularizer for recurrent network models by requiring robustness in prediction performance to perturbations of the training data. The regularizer can be viewed as a generalization of the first-order Tikhonov stabilizer to dynamic models. The closed-form expression of the regularizer covers both time-lagged and simultaneous recurrent nets, with feedforward nets and one-layer...
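The paper's closed-form dynamic regularizer is not reproduced here; as a rough illustration of the underlying idea, the following NumPy sketch implements a generic first-order Tikhonov smoothing penalty for a one-hidden-layer feedforward net, penalizing the input-output Jacobian so that small perturbations of the inputs produce small changes in the outputs. The weights, sizes, and the strength lam are arbitrary stand-ins, not quantities from the paper.

    import numpy as np

    # Hypothetical one-hidden-layer network: y = V @ tanh(W @ x + b)
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.5, size=(8, 3))   # input-to-hidden weights
    b = np.zeros(8)
    V = rng.normal(scale=0.5, size=(1, 8))   # hidden-to-output weights

    def tikhonov_smoothing_penalty(X):
        """Mean squared Frobenius norm of the input-output Jacobian over a batch X.

        Penalizing dy/dx encourages outputs that change smoothly under small
        perturbations of the inputs (a first-order Tikhonov stabilizer).
        """
        penalty = 0.0
        for x in X:
            h = np.tanh(W @ x + b)
            # Jacobian of y w.r.t. x for this network: V @ diag(1 - h^2) @ W
            J = V @ (np.diag(1.0 - h**2) @ W)
            penalty += np.sum(J**2)
        return penalty / len(X)

    X = rng.normal(size=(32, 3))             # a batch of inputs
    lam = 1e-2                               # regularization strength (hyperparameter)
    loss = 0.0                               # ... data-fit term would go here ...
    loss += lam * tikhonov_smoothing_penalty(X)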


Tikhonov Regularization for Long Short-Term Memory Networks

It is a well-known fact that adding noise to the input data often improves network performance. While the dropout technique may cause memory loss when applied to recurrent connections, Tikhonov regularization, which can be regarded as training with additive noise, avoids this issue naturally, though it requires deriving the regularizer for each architecture. In case of fee...
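As a minimal numerical sketch of the additive-noise view (using a plain linear model rather than an LSTM, so that the identity is exact rather than first-order), the snippet below checks that the expected squared error under Gaussian input noise equals the clean squared error plus a Tikhonov (ridge-style) penalty sigma^2 * ||w||^2. The data, weights, and noise level are made up for illustration.

    import numpy as np

    # For a linear model, training with additive input noise is equivalent in
    # expectation to Tikhonov (ridge) regularization of the weights.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    w_true = rng.normal(size=5)
    y = X @ w_true + 0.1 * rng.normal(size=200)

    w = rng.normal(size=5)          # some candidate weights
    sigma = 0.3                     # std of the additive input noise

    def mse(w, X, y):
        return np.mean((X @ w - y) ** 2)

    # Monte-Carlo estimate of the loss under input noise
    noisy = np.mean([mse(w, X + sigma * rng.normal(size=X.shape), y)
                     for _ in range(2000)])

    # Exact identity in the linear case:
    # E[(w.(x+eps) - y)^2] = (w.x - y)^2 + sigma^2 * ||w||^2
    tikhonov = mse(w, X, y) + sigma**2 * np.sum(w**2)

    print(noisy, tikhonov)          # the two values should agree closely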


Estimating Tract Variables from Acoustics via Neural Networks

This project aims to effectively estimate vocal tract constriction variable (TV) trajectories using artificial neural networks. These trajectories can then be included as input in automatic speech recognition systems with the hope of improving their performance in the presence of coarticulation. Two types of artificial neural networks will be implemented: a feedforward artificial neural network...
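As a hypothetical sketch of the frame-wise regression setup such a project might use, the snippet below maps 13 acoustic features per frame to 6 tract-variable values with a plain feedforward network (scikit-learn's MLPRegressor); the feature and TV dimensions, hidden sizes, and random stand-in data are illustrative assumptions, not details from the project.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Stand-in data: 13 acoustic features (e.g. MFCCs) per frame, 6 TV targets.
    # Real feature choices and TV inventories differ by study.
    rng = np.random.default_rng(2)
    acoustics = rng.normal(size=(1000, 13))          # acoustic frames
    tvs = rng.normal(size=(1000, 6))                 # tract-variable targets

    # A plain feedforward network as the frame-by-frame regressor
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
    model.fit(acoustics, tvs)
    predicted_tvs = model.predict(acoustics)         # estimated TV trajectories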


Equivalence Results between Feedforward and Recurrent Neural Networks for Sequences

In the context of sequence processing, we study the relationship between single-layer feedforward neural networks, which have simultaneous access to all items composing a sequence, and single-layer recurrent neural networks, which access information one step at a time. We treat both linear and nonlinear networks, describing a constructive procedure, based on linear autoencoders for sequences, tha...
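The paper's constructive procedure is not reproduced here; the sketch below only illustrates the flavor of such an equivalence with the simplest possible linear recurrent network, one whose state stacks the inputs it has seen, so that any feedforward readout of the final state coincides with a feedforward network applied to the concatenated sequence. The dimensions are arbitrary.

    import numpy as np

    d, T = 2, 4                       # input size and sequence length
    n = d * T                         # state size

    A = np.zeros((n, n))              # block down-shift of the state by d slots
    A[d:, :-d] = np.eye(n - d)
    B = np.zeros((n, d))              # writes the current input into the top slot
    B[:d, :] = np.eye(d)

    rng = np.random.default_rng(3)
    xs = rng.normal(size=(T, d))      # a random input sequence

    h = np.zeros(n)
    for x in xs:                      # recurrent pass: h_t = A h_{t-1} + B x_t
        h = A @ h + B @ x

    stacked = np.concatenate(xs[::-1])  # most recent input first
    assert np.allclose(h, stacked)      # the state is exactly the stacked sequence

    # Any readout f(h_T), linear or nonlinear, therefore matches a feedforward
    # network that sees the concatenated sequence directly.
    W_out = rng.normal(size=(3, n))
    assert np.allclose(W_out @ h, W_out @ stacked)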


A guide to recurrent neural networks and backpropagation

This paper provides guidance on some of the concepts surrounding recurrent neural networks. In contrast to feedforward networks, recurrent networks can be sensitive to, and adapt to, past inputs. Backpropagation learning is described for feedforward networks, adapted to suit our (probabilistic) modeling needs, and extended to cover recurrent networks. The aim of this brief paper is to set the sce...
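As a companion sketch (not taken from the paper), the following NumPy code runs backpropagation through time for a vanilla RNN with a tanh state update and a squared-error loss on a linear readout of the final state; the sizes, single target, and parameter names are illustrative choices.

    import numpy as np

    # Vanilla RNN: h_t = tanh(W h_{t-1} + U x_t), loss = 0.5 * (v.h_T - target)^2
    rng = np.random.default_rng(4)
    d_in, d_h, T = 3, 5, 6
    U = rng.normal(scale=0.3, size=(d_h, d_in))
    W = rng.normal(scale=0.3, size=(d_h, d_h))
    v = rng.normal(scale=0.3, size=d_h)

    xs = rng.normal(size=(T, d_in))
    target = 1.0

    # Forward pass, storing the states needed for the backward pass
    hs = [np.zeros(d_h)]
    for x in xs:
        hs.append(np.tanh(W @ hs[-1] + U @ x))
    y = v @ hs[-1]
    loss = 0.5 * (y - target) ** 2

    # Backward pass: propagate dL/dh_t backwards through time
    dU, dW = np.zeros_like(U), np.zeros_like(W)
    dv = (y - target) * hs[-1]         # gradient of the readout weights
    dh = (y - target) * v              # dL/dh_T
    for t in range(T, 0, -1):
        da = (1.0 - hs[t] ** 2) * dh   # through the tanh nonlinearity
        dU += np.outer(da, xs[t - 1])
        dW += np.outer(da, hs[t - 1])
        dh = W.T @ da                  # pass gradient to the previous state

The resulting gradients dU, dW, and dv could then be fed to any gradient-based optimizer.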




Publication year: 1996